
Codex MCP Server

Created by Taras Trishchuk
MCP server bridging AI assistants to OpenAI Codex CLI for code analysis and review
Overview

Codex MCP Tool

GitHub Release npm version npm downloads License: MIT

MCP server connecting Claude/Cursor to Codex CLI. Enables code analysis via @ file references, multi-turn conversations, sandboxed edits, and structured change mode.

Features

  • File Analysis — Reference files with @src/, @package.json syntax
  • Multi-Turn Sessions — Conversation continuity with workspace isolation
  • Native Resume — Uses codex resume for context preservation (CLI v0.36.0+)
  • Local OSS Models — Run with Ollama or LM Studio via localProvider
  • Web Search — Research capabilities with search: true
  • Sandbox Mode — Safe code execution with --full-auto
  • Change Mode — Structured OLD/NEW patch output for refactoring
  • Brainstorming — SCAMPER, design-thinking, lateral thinking frameworks
  • Health Diagnostics — CLI version, features, and session monitoring
  • Cross-Platform — Windows, macOS, Linux fully supported

Quick Start

```shell
claude mcp add codex-cli -- npx -y @trishchuk/codex-mcp-tool
```

Prerequisites: Node.js 18+, Codex CLI installed and authenticated.

Configuration

```json
{
  "mcpServers": {
    "codex-cli": {
      "command": "npx",
      "args": ["-y", "@trishchuk/codex-mcp-tool"]
    }
  }
}
```

Config locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
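If you edit the config programmatically, take care not to clobber other registered servers. A minimal sketch (the `addCodexServer` helper and `McpConfig` type are illustrative, not part of the tool; file I/O is left to the caller):

```typescript
// Merge the codex-cli entry into an existing Claude Desktop config
// object, preserving any other servers already registered.
type McpConfig = {
  mcpServers?: Record<string, { command: string; args: string[] }>;
};

function addCodexServer(config: McpConfig): McpConfig {
  return {
    ...config,
    mcpServers: {
      ...(config.mcpServers ?? {}),
      "codex-cli": { command: "npx", args: ["-y", "@trishchuk/codex-mcp-tool"] },
    },
  };
}
```

Read the platform-specific file above, `JSON.parse` it, pass the result through the helper, and write it back.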

Usage Examples

```js
// File analysis
'explain the architecture of @src/';
'analyze @package.json and list dependencies';

// With specific model
'use codex with model gpt-5.4 to analyze @algorithm.py';

// Multi-turn conversations (v1.4.0+)
'ask codex sessionId:"my-project" prompt:"explain @src/"';
'ask codex sessionId:"my-project" prompt:"now add error handling"';

// Brainstorming
'brainstorm ways to optimize CI/CD using SCAMPER method';

// Sandbox mode
'use codex sandbox:true to create and run a Python script';

// Web search
'ask codex search:true prompt:"latest TypeScript 5.7 features"';

// Local OSS model (Ollama)
'ask codex localProvider:"ollama" model:"qwen3:8b" prompt:"explain @src/"';
```

Tools

| Tool | Description |
| --- | --- |
| ask-codex | Execute Codex CLI with file analysis, models, sessions |
| brainstorm | Generate ideas with SCAMPER, design-thinking, etc. |
| list-sessions | View/delete/clear conversation sessions |
| health | Diagnose CLI installation, version, features |
| ping / help | Test connection, show CLI help |

Models

Default: gpt-5.4, with fallback chain gpt-5.3-codex → gpt-5.2-codex → gpt-5.1-codex-max → gpt-5.2
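The fallback behavior can be sketched as walking the chain until an available model is found (a minimal illustration; `pickModel` and the availability callback are hypothetical, not the tool's actual API):

```typescript
// Default model first, then the documented fallback chain in order.
const FALLBACK_CHAIN = [
  "gpt-5.4",
  "gpt-5.3-codex",
  "gpt-5.2-codex",
  "gpt-5.1-codex-max",
  "gpt-5.2",
];

// Returns the first model the given predicate accepts, or undefined
// if none in the chain is available.
function pickModel(isAvailable: (model: string) => boolean): string | undefined {
  return FALLBACK_CHAIN.find(isAvailable);
}
```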

| Model | Use Case |
| --- | --- |
| gpt-5.4 | Latest frontier agentic coding (default) |
| gpt-5.3-codex | Frontier agentic coding |
| gpt-5.2-codex | Frontier agentic coding |
| gpt-5.1-codex-max | Deep and fast reasoning |
| gpt-5.1-codex-mini | Cost-efficient quick tasks |
| gpt-5.2 | Broad knowledge, reasoning and coding |

Key Features

Session Management (v1.4.0+)

Multi-turn conversations with workspace isolation:

```jsonc
{ "prompt": "analyze code", "sessionId": "my-session" }
{ "prompt": "continue from here", "sessionId": "my-session" }
{ "prompt": "start fresh", "sessionId": "my-session", "resetSession": true }
```

Environment:

  • CODEX_SESSION_TTL_MS - Session TTL (default: 24h)
  • CODEX_MAX_SESSIONS - Max sessions (default: 50)
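The two environment variables suggest a store with time-based expiry and a session cap. A hypothetical sketch of that behavior (the server's actual internals may differ; the defaults mirror the documented values):

```typescript
// Sessions expire after a TTL, and the least-recently-used session is
// evicted once the cap is reached.
interface Session {
  id: string;
  lastUsed: number;
}

class SessionStore {
  private sessions = new Map<string, Session>();
  constructor(
    private ttlMs = 24 * 60 * 60 * 1000, // CODEX_SESSION_TTL_MS default: 24h
    private maxSessions = 50,            // CODEX_MAX_SESSIONS default: 50
  ) {}

  touch(id: string, now = Date.now()): Session {
    // Drop expired sessions first.
    for (const [key, s] of this.sessions) {
      if (now - s.lastUsed > this.ttlMs) this.sessions.delete(key);
    }
    // Evict the least-recently-used session when at capacity.
    if (!this.sessions.has(id) && this.sessions.size >= this.maxSessions) {
      const oldest = [...this.sessions.values()].sort((a, b) => a.lastUsed - b.lastUsed)[0];
      this.sessions.delete(oldest.id);
    }
    const session = { id, lastUsed: now };
    this.sessions.set(id, session);
    return session;
  }

  size(): number {
    return this.sessions.size;
  }
}
```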

Local OSS Models (v1.6.0+)

Run with local Ollama or LM Studio instead of OpenAI:

```jsonc
// Ollama
{ "prompt": "analyze @src/", "localProvider": "ollama", "model": "qwen3:8b" }

// LM Studio
{ "prompt": "analyze @src/", "localProvider": "lmstudio", "model": "my-model" }

// Auto-select provider
{ "prompt": "analyze @src/", "oss": true }
```

Requirements: Ollama running locally with a model that supports tool calling (e.g. qwen3:8b).

Advanced Options

| Parameter | Description |
| --- | --- |
| model | Model selection |
| sessionId | Enable conversation continuity |
| sandbox | Enable --full-auto mode |
| search | Enable web search |
| changeMode | Structured OLD/NEW edits |
| addDirs | Additional writable directories |
| toolOutputTokenLimit | Cap response verbosity (100-10,000 tokens) |
| reasoningEffort | Reasoning depth: low, medium, high, xhigh |
| oss | Use local OSS model provider |
| localProvider | Local provider: lmstudio or ollama |
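The exact changeMode patch format is not specified here; assuming a simple OLD/NEW pair, applying one could look like this sketch (the `Change` shape and `applyChange` helper are illustrative assumptions, not the tool's API):

```typescript
// Apply a single structured OLD/NEW edit to source text, failing
// loudly if the OLD text is not found rather than silently no-op'ing.
interface Change {
  oldText: string;
  newText: string;
}

function applyChange(source: string, change: Change): string {
  if (!source.includes(change.oldText)) {
    throw new Error("OLD text not found in source");
  }
  return source.replace(change.oldText, change.newText);
}
```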

CLI Compatibility

| Version | Features |
| --- | --- |
| v0.60.0+ | GPT-5.2 model family |
| v0.59.0+ | --add-dir, token limits |
| v0.52.0+ | Native --search flag |
| v0.36.0+ | Native codex resume (sessions) |
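Gating features against the table above amounts to a numeric major.minor.patch comparison (a minimal sketch; `atLeast` and `supportsResume` are illustrative helpers, not part of the tool):

```typescript
// Compare "major.minor.patch" version strings numerically,
// tolerating an optional leading "v".
function atLeast(version: string, required: string): boolean {
  const parse = (v: string) => v.replace(/^v/, "").split(".").map(Number);
  const [a, b] = [parse(version), parse(required)];
  for (let i = 0; i < 3; i++) {
    if ((a[i] ?? 0) !== (b[i] ?? 0)) return (a[i] ?? 0) > (b[i] ?? 0);
  }
  return true;
}

// Example gate: native session resume requires CLI v0.36.0+.
const supportsResume = (cliVersion: string) => atLeast(cliVersion, "0.36.0");
```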

Troubleshooting

```shell
codex --version    # Check CLI version
codex login        # Authenticate
```

Use the health tool for diagnostics: 'use health verbose:true'

Migration

v2.0.x → v2.1.0: gpt-5.4 as new default model, updated fallback chain.

v1.5.x → v1.6.0: Local OSS model support (localProvider, oss), gpt-5.3-codex default model, xhigh reasoning effort.

v1.3.x → v1.4.0: New sessionId parameter, list-sessions/health tools, structured error handling. No breaking changes.

License

MIT License. Not affiliated with OpenAI.


Documentation | Issues | Inspired by jamubc/gemini-mcp-tool
